A Flexible, Parallel Generator of Natural Language
Author: Nigel Ward
Abstract
My Ph.D. thesis (Ward 1992, 1991) addressed the task of generating natural language utterances. It was motivated by two difficulties in scaling up existing generators. Current generators only accept input that is relatively poor in information, such as feature structures or lists of propositions; they are unable to deal with input rich in information, as one might expect from, for example, an expert system with a complete model of its domain or a natural language understander with good inference ability. Current generators also have a very restricted knowledge of language; indeed, they succeed largely because they have few syntactic or lexical options available (McDonald 1987). They are unable to cope with more knowledge because they deal with interactions among the various possible choices only as special cases.

To address these and other issues, I built a system called FIG (flexible incremental generator). FIG is based on a single associative network that encodes lexical knowledge, syntactic knowledge, and world knowledge. Computation is done by spreading activation across the network, supplemented with a small amount of symbolic processing. Thus, FIG is a spreading activation or structured connectionist system (Feldman et al. 1988).

In the initial state, some nodes representing concepts are sources of activation; this pattern of activation represents the information to be expressed. Activation flows from these nodes to nodes representing words through the various knowledge structures of the network. When the network settles, the most highly activated word is selected and emitted. Activation levels are then updated to represent the new current state, in both syntactic and semantic aspects. This process of settle, emit, and update repeats until all the input has been conveyed. Syntactic knowledge is encoded as constructions, which affect the utterance by transmitting activation to appropriate words; for example, a word receives activation from the subject-predicate construction because it is a verb. FIG’s syntactic coverage is much broader than that of previous connectionist generators such as Gasser (1988); outputs include “once upon a time there lived an old man and an old woman,” “one day the old woman went to a stream to wash clothes,” and “John ate a peach with an old woman’s fork.”

The success of this model in generating utterances of English and Japanese suggests that the complexity present in most treatments of syntax is unnecessary. FIG dispenses with the assembly of syntactic structures; constructions affect the utterance only by the activation they transmit, directly or indirectly, to words. FIG does without a mechanism for explicit syntactic choice; any number of constructions are potentially active, competing or cooperating in parallel, and the choice among them is emergent. Phenomena traditionally considered to require instantiation and variable binding are handled in FIG with much simpler mechanisms. Grammatical output results not from constraints on the form of syntactic structures or the behavior of an algorithm but, rather, from the structure and weights of the network as a whole.

This paragraph summarizes the ways in which FIG addresses the issues that motivated its construction. It handles arbitrarily rich input because the number of nodes activated in the initial state makes no difference to its operation. It handles interaction among choices easily because, as a result of links among nodes that represent such choices, it tends to settle into a state representing a compatible set of choices. It handles trade-offs among competing goals without additional mechanism because all computation is in terms of numbers.
Thus, FIG is the first generator potentially able to perform well at the complex generation tasks that will arise in the future. Of course, realizing this potential requires more experimentation with the details of activation flow and with ways to …
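To make the settle, emit, and update loop concrete, the following is a minimal sketch of generation by spreading activation, written in Python. It is my own illustration, not FIG's actual network or algorithm: the node names, link weights, word categories, settling schedule, and the single three-slot construction are all assumptions made for the example.

```python
from collections import defaultdict


class Generator:
    def __init__(self):
        self.links = defaultdict(dict)   # links[src][dst] = weight (concept -> word, etc.)
        self.category = {}               # word -> syntactic category
        self.constituents = []           # ordered categories of one toy construction
        self.cursor = 0                  # which constituent is currently due

    def link(self, src, dst, weight):
        self.links[src][dst] = weight

    def settle(self, sources, rounds=3, decay=0.5):
        """Spread activation outward from the source concepts for a few rounds."""
        act = defaultdict(float, sources)
        for _ in range(rounds):
            incoming = defaultdict(float)
            for src, outs in self.links.items():
                for dst, weight in outs.items():
                    incoming[dst] += act[src] * weight
            for node in set(act) | set(incoming):
                act[node] = decay * act[node] + incoming[node]
            for node, energy in sources.items():   # unexpressed input stays clamped
                act[node] = max(act[node], energy)
        # the construction transmits extra activation to words of the category it needs next
        if self.cursor < len(self.constituents):
            for word, cat in self.category.items():
                if cat == self.constituents[self.cursor]:
                    act[word] += 0.5
        return act

    def generate(self, concepts, max_len=10):
        """Settle, emit the most highly activated word, update the state, repeat."""
        sources = {c: 1.0 for c in concepts}
        emitted = []
        while sources and len(emitted) < max_len:
            act = self.settle(sources)
            # simplification: never repeat a word (FIG's update step allows repetition)
            candidates = [w for w in self.category if w not in emitted]
            if not candidates:
                break
            word = max(candidates, key=lambda w: act[w])
            emitted.append(word)
            # semantic update: a concept counts as conveyed once a word it links to is emitted
            sources = {c: e for c, e in sources.items() if word not in self.links[c]}
            # syntactic update: advance past the constituent the emitted word just filled
            if (self.cursor < len(self.constituents)
                    and self.category[word] == self.constituents[self.cursor]):
                self.cursor += 1
        return " ".join(emitted)


if __name__ == "__main__":
    g = Generator()
    g.category = {"John": "noun", "ate": "verb", "peach": "noun"}
    g.constituents = ["noun", "verb", "noun"]           # subject, verb, object slots
    g.link("c-john", "John", 1.0)                       # world/lexical knowledge links
    g.link("c-eat", "ate", 1.0)
    g.link("c-peach", "peach", 0.9)
    print(g.generate(["c-john", "c-eat", "c-peach"]))   # -> John ate peach
```

In FIG itself there is no single hard-coded construction: any number of constructions are active in parallel, and word choice and word order emerge from the activation they transmit; the cursor bookkeeping above only gestures at the syntactic side of the update step.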
Related resources
Design and Dynamic Modeling of Planar Parallel Micro-Positioning Platform Mechanism with Flexible Links Based on Euler Bernoulli Beam Theory
This paper presents the dynamic modeling and design of a micro-motion compliant parallel mechanism with flexible intermediate links and a rigid moving platform. The mechanism is modeled with closed kinematic loops, and the dynamic equations are derived using Lagrange multipliers and Kane’s methods. Euler-Bernoulli beam theory is used to model the intermediate flexible link. Based...
A Flexible Pragmatics-Driven Language Generator for Animated Agents
This paper describes the NECA MNLG, a fully implemented Multimodal Natural Language Generation module. The MNLG is deployed as part of the NECA system, which generates dialogues between animated agents. The generation module supports the seamless integration of full grammar rules, templates, and canned text. The generator takes input which allows for the specification of syntactic, semantic and p...
A Parallel Approach to Syntax for Generation
To produce good utterances from non-trivial inputs a natural language generator should consider many words in parallel, which raises the question of how to handle syntax in a parallel generator. If a generator is incremental and centered on the task of word choice, then the role of syntax is merely to help evaluate the appropriateness of words. One way to do this is to represent syntactic knowl...
Design and Implementation of an Intelligent Part of Speech Generator
The aim of this paper is to report on an attempt to design and implement an intelligent system capable of generating the correct part of speech for a given sentence while the sentence is totally new to the system and not stored in any database available to the system. It follows the same steps a normal individual does to provide the correct parts of speech using a natural language processor. It...
Using UTAGS for Incremental and Parallel Generation
Exploiting an incremental and parallel processing scheme is useful to improve the performance of natural language generation systems. TAG-GEN is a TAG-based syntactic generator that realizes both principles. It is shown how the demands of incremental and parallel generation influence the definition, the design, and the processing of syntactic rules on the basis of Tree Adjoining Grammars.
Journal: AI Magazine
Volume: 13
Issue: -
Pages: -
Publication year: 1992